Logarithmic distributions prove that intrinsic learning is Hebbian
In this paper, we present data for the lognormal distributions of spike rates, synaptic weights, and intrinsic excitability (gain) for neurons in various brain areas, such as auditory and visual cortex, hippocampus, cerebellum, striatum, and midbrain nuclei. We find a remarkable consistency of heavy-tailed, specifically lognormal, distributions for rates, weights, and gains across all brain areas examined...
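To make the central claim concrete: a variable is lognormally distributed when its logarithm is normally distributed, which produces the heavy right tail the abstract describes. The sketch below uses synthetic data with illustrative parameters (mu, sigma are assumptions, not values from the paper):

```python
import numpy as np

# Illustration only: "lognormal firing rates" means log(rate) is normal.
# mu and sigma parameterize the underlying normal and are illustrative.
rng = np.random.default_rng(0)
mu, sigma = 0.5, 1.0
rates = rng.lognormal(mean=mu, sigma=sigma, size=100_000)

# Heavy right tail: the mean sits well above the median.
print(rates.mean(), np.median(rates))

# Closed-form checks: median = exp(mu), mean = exp(mu + sigma^2 / 2).
print(np.isclose(np.median(rates), np.exp(mu), rtol=0.05))
print(np.isclose(rates.mean(), np.exp(mu + sigma**2 / 2), rtol=0.05))
```

The gap between mean and median is one simple diagnostic of the heavy-tailedness reported for rates, weights, and gains.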
Similar articles

Excitability changes that complement Hebbian learning.
Experiments have shown that the intrinsic excitability of neurons is not constant, but varies with physiological stimulation and during various learning paradigms. We study a model of Hebbian synaptic plasticity which is supplemented with intrinsic excitability changes. The excitability changes transcend time delays and provide a memory trace. Periods of selective enhanced excitability can thus...
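The idea of an excitability change that outlasts the stimulus can be sketched with a toy model (not the paper's actual model): a Hebbian weight update paired with a slow gain variable that is driven by activity and relaxes back to baseline, leaving a decaying memory trace. All constants are illustrative:

```python
import numpy as np

# Toy sketch, assumed constants: Hebbian plasticity plus a slow intrinsic
# gain g that rises with activity and decays back to baseline 1.
rng = np.random.default_rng(1)
eta_w, eta_g, tau_g = 0.01, 0.05, 20.0

w = 0.1 * rng.random(10)   # small positive synaptic weights
g = 1.0                    # intrinsic gain (excitability), baseline 1
x = rng.random(10)         # a fixed presynaptic input pattern

for _ in range(20):        # stimulation period
    y = g * (w @ x)                      # postsynaptic rate, scaled by gain
    w += eta_w * y * x                   # Hebbian: pre*post co-activity
    g += eta_g * y - (g - 1.0) / tau_g   # activity raises excitability

g_after = g
for _ in range(200):       # quiet period: no input, the trace decays
    g += -(g - 1.0) / tau_g

print(g_after > 1.0)        # stimulation leaves the neuron more excitable
print(abs(g - 1.0) < 0.01)  # the trace later fades back to baseline
```

The elevated gain after stimulation is the "memory trace": it bridges the delay between episodes of Hebbian learning before relaxing away.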
Intrinsic Stabilization of Output Rates by Spike-Based Hebbian Learning
We study analytically a model of long-term synaptic plasticity where synaptic changes are triggered by presynaptic spikes, postsynaptic spikes, and the time differences between presynaptic and postsynaptic spikes. The changes due to correlated input and output spikes are quantified by means of a learning window. We show that plasticity can lead to an intrinsic stabilization of the mean firing r...
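A learning window maps the pre/post spike-time difference to a weight change. A generic exponential spike-timing window (not necessarily the paper's exact kernel; amplitudes and time constants are assumptions) can be sketched as:

```python
import numpy as np

# Generic exponential STDP window, illustrative parameters:
# pre-before-post (dt >= 0) potentiates, post-before-pre depresses.
def learning_window(dt, A_plus=1.0, A_minus=1.0, tau_plus=20.0, tau_minus=20.0):
    """Weight change for spike-time difference dt = t_post - t_pre (ms)."""
    return np.where(dt >= 0,
                    A_plus * np.exp(-dt / tau_plus),     # causal pair: LTP
                    -A_minus * np.exp(dt / tau_minus))   # anti-causal: LTD

dts = np.array([-40.0, -5.0, 5.0, 40.0])
print(learning_window(dts))  # depression for dt < 0, potentiation for dt > 0
```

Integrating this window against the input-output spike correlations yields the mean weight drift that the analysis above shows can stabilize the output rate.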
Derivatives of Logarithmic Stationary Distributions for Policy Gradient Reinforcement Learning
Most conventional policy gradient reinforcement learning (PGRL) algorithms neglect (or do not explicitly make use of) a term in the average reward gradient with respect to the policy parameter. That term involves the derivative of the stationary state distribution that corresponds to the sensitivity of its distribution to changes in the policy parameter. Although the bias introduced by this omi...
Robustness of Hebbian and Anti-Hebbian Learning
Self-organizing neural networks with Hebbian and anti-Hebbian learning rules were found robust against variations in the parameters of neurons of the network, such as neural activities, learning rates and noisy inputs. Robustness was evaluated from the point of view of properties of soft competition for input correlations. Two models were studied: a neural network with presynaptic Hebbian learn...
Journal

Journal title: F1000Research
Year: 2017
ISSN: 2046-1402
DOI: 10.12688/f1000research.12130.1